Kullback-Leibler approximation of spectral density functions

Authors

  • Tryphon T. Georgiou
  • Anders Lindquist
Abstract

We introduce a Kullback-Leibler type distance between spectral density functions of stationary stochastic processes and solve the problem of optimal approximation of a given spectral density Ψ by one that is consistent with prescribed second-order statistics. In general, such statistics are expressed as the state covariance of a linear filter driven by a stochastic process whose spectral density is sought. In this context, we show (i) that there is a unique spectral density Φ which minimizes this Kullback-Leibler distance, (ii) that this optimal approximant is of the form Ψ/Q where the “correction term” Q is a rational spectral density function, and (iii) that the coefficients of Q can be obtained numerically by solving a suitable convex optimization problem. In the special case where Ψ = 1, the convex functional becomes quadratic and the solution is then specified by linear equations.
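For intuition, the following is a minimal numerical sketch of a discretized primal problem of this type, written for the special case where the prescribed second-order statistics are a few autocovariances (a simple instance of a state-covariance constraint). The frequency grid, the prior Ψ, and the target covariances are illustrative assumptions, not data from the paper, and this cvxpy-based formulation is not the numerical scheme proposed by the authors.

```python
import numpy as np
import cvxpy as cp

# Frequency grid over [-pi, pi); the discretization is only for illustration.
N = 400
theta = np.linspace(-np.pi, np.pi, N, endpoint=False)

# Prior spectral density Psi to be approximated (an arbitrary illustrative choice).
psi = 1.0 + 0.5 * np.cos(theta)

# Target second-order statistics: a few autocovariances generated from some
# other density, standing in for the prescribed state covariance.
phi_ref = 1.0 / (1.25 - np.cos(theta))
n = 4
c = np.array([np.mean(np.cos(k * theta) * phi_ref) for k in range(n)])

# Discretized primal problem: minimize the KL-type distance
#   (1/2pi) * integral of psi*log(psi/phi) d(theta)  ~  grid mean of rel_entr(psi, phi)
# over densities phi that match the prescribed covariances (linear constraints).
phi = cp.Variable(N, pos=True)
objective = cp.Minimize(cp.sum(cp.rel_entr(psi, phi)) / N)
constraints = [cp.sum(cp.multiply(np.cos(k * theta), phi)) / N == c[k]
               for k in range(n)]
problem = cp.Problem(objective, constraints)
problem.solve(solver=cp.SCS)

print("optimal KL-type distance:", problem.value)
# Per the result quoted in the abstract, the minimizer should take the form
# psi / Q, with Q a low-degree (here trigonometric-polynomial) correction term.
```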


Similar articles

A Global Convergence Analysis of the Pavon–Ferrante Algorithm for Spectral Estimation

In this paper, we provide a detailed analysis of the global convergence properties of an extensively studied and extremely effective fixed-point algorithm for the Kullback–Leibler approximation of spectral densities, proposed by Pavon and Ferrante in [1]. Our main result states that the algorithm globally converges to one of its fixed points. Index Terms: Approximation of spectral densities, spe...

Full text

Comparison of Kullback-Leibler, Hellinger and LINEX with Quadratic Loss Function in Bayesian Dynamic Linear Models: Forecasting of Real Price of Oil

In this paper we examine the application of the Kullback-Leibler, Hellinger and LINEX loss functions in a Dynamic Linear Model, using the real price of oil over 106 years of data from 1913 to 2018, with regard to the asymmetry problem in filtering and forecasting. We use the DLM form of the basic Hotelling model under the quadratic loss function, Kullback-Leibler, Hellinger and LINEX, trying to address the ... (See the reference formulas below.)
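For reference, the loss and divergence functions named above are commonly defined as follows (standard conventions, not quoted from this article); here f and g are densities and Δ denotes the estimation or forecast error:

```latex
% Standard definitions (one common convention); not taken from the article text.
\[
  \mathrm{KL}(f\,\|\,g) = \int f(x)\,\log\frac{f(x)}{g(x)}\,dx, \qquad
  H^2(f,g) = \tfrac{1}{2}\int \bigl(\sqrt{f(x)}-\sqrt{g(x)}\bigr)^{2}\,dx,
\]
\[
  L_{\mathrm{LINEX}}(\Delta) = b\bigl(e^{a\Delta} - a\Delta - 1\bigr),
  \qquad a \neq 0,\; b > 0.
\]
```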

Full text

Model Confidence Set Based on Kullback-Leibler Divergence Distance

Consider the problem of estimating the true density h(.) based upon a random sample X1, …, Xn. In general, h(.) is approximated using an appropriate (in some sense; see below) model f_θ(x). This article, using Vuong's (1989) test along with a collection of k (> 2) non-nested models, constructs a set of appropriate models, a so-called model confidence set, for the unknown model h(.). Application of such confide...

Full text

Approximation by Log-Concave Distributions with Applications to Regression

We study the approximation of arbitrary distributions P on d-dimensional space by distributions with log-concave density. Approximation means minimizing a Kullback–Leibler type functional. We show that such an approximation exists if, and only if, P has finite first moments and is not supported by some hyperplane. Furthermore we show that this approximation depends continuously on P with respect...

Full text

Using Kullback-Leibler distance for performance evaluation of search designs

This paper considers the search problem introduced by Srivastava [Sr]. This is a model discrimination problem. In the context of search linear models, the discrimination ability of search designs has been studied by several researchers. Some criteria have been developed to measure this capability; however, they are restricted in the sense of being able to work for searching only one possibl...

Full text


Journal:
  • IEEE Trans. Information Theory

Volume 49, Issue –

Pages –

Publication year: 2003